Discriminative Direction for Kernel Classifiers
Author
Abstract
In many scientific and engineering applications, detecting and understanding differences between two groups of examples can be reduced to the classical problem of training a classifier to label new examples while making as few mistakes as possible. In the traditional classification setting, the resulting classifier is rarely analyzed in terms of the properties of the input data captured by the discriminative model. However, such analysis is crucial if we want to understand and visualize the detected differences. We propose an approach to interpreting the statistical model in the original feature space that allows us to reason about the model in terms of the relevant changes to the input vectors. For each point in the input space, we define the discriminative direction to be the direction that moves the point towards the other class while introducing as little irrelevant change as possible with respect to the classifier function. We derive the discriminative direction for kernel-based classifiers, demonstrate the technique on several examples, and briefly discuss its use in statistical shape analysis, the application that originally motivated this work.
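To make the definition above concrete, the following Python sketch (a toy construction of mine, not the paper's exact derivation) trains an RBF-kernel support vector machine and uses the input-space gradient of its decision function as an approximation to the discriminative direction at a given point: moving along this direction changes the classifier output as quickly as possible. The synthetic data, the kernel width gamma and the helper names are assumptions made only for illustration.

# Minimal sketch: approximate the discriminative direction at a point by the
# input-space gradient of an RBF-kernel SVM decision function.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.5, (50, 2)),    # class -1
               rng.normal(+1.0, 0.5, (50, 2))])   # class +1
y = np.array([-1] * 50 + [+1] * 50)

gamma = 0.5
clf = SVC(kernel="rbf", gamma=gamma, C=1.0).fit(X, y)

def discriminative_direction(x):
    """Gradient of f(x) = sum_i a_i k(x_i, x) + b for the RBF kernel,
    normalized to unit length; following it (or its negative) pushes the
    point towards the other class with maximal effect on f."""
    sv = clf.support_vectors_                      # support vectors x_i
    a = clf.dual_coef_.ravel()                     # a_i = alpha_i * y_i
    diff = x - sv                                  # (n_sv, d)
    k = np.exp(-gamma * np.sum(diff**2, axis=1))   # k(x_i, x)
    grad = (a * k) @ (-2.0 * gamma * diff)         # d f / d x
    return grad / np.linalg.norm(grad)

x0 = np.array([-0.8, -0.9])                        # a point from class -1
print("discriminative direction at x0:", discriminative_direction(x0))

For a linear kernel the gradient reduces to the weight vector w, so the sketch recovers the obvious answer in that special case.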
Related references
Probabilistic Discriminative Kernel Classifiers for Multi-class Problems
Logistic regression is presumably the most popular representative of probabilistic discriminative classifiers. In this paper, a kernel variant of logistic regression is introduced as an iteratively re-weighted least-squares algorithm in kernel-induced feature spaces. This formulation allows us to apply highly efficient approximation methods that are capable of dealing with large-scale problems....
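As a rough illustration of the iteratively re-weighted least-squares view described in this snippet, the sketch below fits kernel logistic regression with an RBF Gram matrix and a ridge penalty by Newton/IRLS updates on the dual coefficients. It is a minimal toy under my own assumptions (kernel choice, penalty, toy data), not the paper's algorithm or its large-scale approximations.

# Minimal sketch: kernel logistic regression fitted by Newton / IRLS on the
# dual coefficients c, where f(x) = sum_i c_i k(x_i, x).
import numpy as np

def rbf_gram(X, gamma=0.5):
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def kernel_logreg_irls(K, t, lam=1e-2, iters=25):
    """t in {0, 1}; returns dual coefficients c with f = K @ c."""
    n = len(t)
    c = np.zeros(n)
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-K @ c))   # current class probabilities
        W = p * (1.0 - p)                  # IRLS weights (diagonal)
        # Newton step on the penalized log-loss:
        # (W K + lam I) delta = (p - t) + lam c   (no line search; fine for this toy)
        delta = np.linalg.solve(W[:, None] * K + lam * np.eye(n),
                                (p - t) + lam * c)
        c -= delta
    return c

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(-1, 0.6, (40, 2)), rng.normal(1, 0.6, (40, 2))])
t = np.array([0] * 40 + [1] * 40)
K = rbf_gram(X)
c = kernel_logreg_irls(K, t)
print("training accuracy:", np.mean((K @ c > 0) == (t == 1)))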
Incorporating Probability into Support Vector Machine for Speaker Recognition
Support Vector Machines (SVMs) are basically discriminative classifiers, and it is hoped that incorporating probability into SVMs will achieve better performance. This paper briefly reviews some of the methods that can be used to carry out the combination. Following one of them, we make it suitable for the task of speaker recognition, and Gaussian Mixture Models (GMM) are used as the g...
Discriminative Gaussian Mixture Models: A Comparison with Kernel Classifiers
We show that a classifier based on Gaussian mixture models (GMM) can be trained discriminatively to improve accuracy. We describe a training procedure based on the extended Baum-Welch algorithm used in speech recognition. We also compare the accuracy and degree of sparsity of the new discriminative GMM classifier with those of generative GMM classifiers, and of kernel classifiers, such as suppo...
A New Discriminative Kernel From Probabilistic Models
Recently, Jaakkola and Haussler (1999) proposed a method for constructing kernel functions from probabilistic models. Their so-called Fisher kernel has been combined with discriminative classifiers such as support vector machines and applied successfully in, for example, DNA and protein analysis. Whereas the Fisher kernel is calculated from the marginal log-likelihood, we propose the TOP kernel...
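For reference, here is a minimal sketch of the Fisher-kernel construction mentioned in this snippet, using a single one-dimensional Gaussian as the probabilistic model (a toy choice of mine; the original work uses richer models such as HMMs) and approximating the Fisher information matrix by the identity.

# Minimal sketch: Fisher score = gradient of the log-likelihood w.r.t. the
# model parameters; the kernel is the inner product of these scores.
import numpy as np

def fisher_score(x, mu, var):
    """Gradient of log N(x; mu, var) with respect to (mu, var)."""
    d_mu = (x - mu) / var
    d_var = ((x - mu) ** 2 - var) / (2.0 * var ** 2)
    return np.array([d_mu, d_var])

def fisher_kernel(x1, x2, mu, var):
    return fisher_score(x1, mu, var) @ fisher_score(x2, mu, var)

# Fit the generative model on unlabeled data, then compare two points.
data = np.random.default_rng(2).normal(0.0, 1.0, 200)
mu, var = data.mean(), data.var()
print(fisher_kernel(0.3, -0.5, mu, var))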
An Empirical Comparison of Kernel-Based and Dissimilarity-Based Feature Spaces
The aim of this paper is to find an answer to the question: What is the difference between dissimilarity-based classifications (DBCs) and other kernel-based classifications (KBCs)? In DBCs [11], classifiers are defined among classes; they are not based on the feature measurements of individual objects, but rather on a suitable dissimilarity measure among them. In KBCs [15], on the other hand, clas...
Journal:
Volume / Issue:
Pages: -
Publication date: 2001